(working title)
Esben Lykke, PhD student
3 May 2023
It was likely a dead end from the get-go :(
Basic Features
ACC-Derived Features
Sensor-Independent Features
Forger, Jewett, and Kronauer (1999): a so-called cubic van der Pol equation
\[\frac{dx_c}{dt}=\frac{\pi}{12}\left\{\mu\left(x_c-\frac{4x_c^3}{3}\right)-x\left[\left(\frac{24}{0.99669\,\tau_x}\right)^2+kB\right]\right\}\]
This model depends on ambient light and core body temperature!
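A minimal forward-Euler sketch of this pacemaker equation, paired with the companion state equation $dx/dt = \frac{\pi}{12}(x_c + B)$ from the same model. The parameter values ($\mu$, $\tau_x$, $k$) and the step size are illustrative assumptions, not values taken from this study:

```python
import math

def kronauer_step(x, x_c, B, mu=0.13, tau_x=24.2, k=0.55, dt=0.01):
    """One Euler step of the cubic van der Pol pacemaker.
    mu, tau_x, k are commonly cited defaults, treated here as
    assumptions; B is the external (light) drive; dt is in hours."""
    dx = (math.pi / 12.0) * (x_c + B)
    dx_c = (math.pi / 12.0) * (
        mu * (x_c - (4.0 * x_c**3) / 3.0)
        - x * ((24.0 / (0.99669 * tau_x)) ** 2 + k * B)
    )
    return x + dx * dt, x_c + dx_c * dt

# Free-run (B = 0) for three simulated days: the cubic damping term
# keeps the oscillation on a bounded limit cycle.
x, x_c = 1.0, 0.0
for _ in range(3 * 24 * 100):  # 72 h at dt = 0.01 h
    x, x_c = kronauer_step(x, x_c, B=0.0)
```

With no light input the state simply oscillates with a roughly 24-hour period, which is what makes it usable as a sensor-independent circadian feature.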
Walch et al. (2019) incorporated this feature using step counts from the Apple Watch
But as Walch et al. (2019) also demonstrated, a simple cosine function does the trick just as well :)
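The cosine stand-in can be as simple as a clock-time feature. A sketch, where the 15:00 acrophase and 24-hour period are illustrative assumptions rather than values from the study:

```python
import math

def cosine_clock_feature(hour_of_day, acrophase_h=15.0, period_h=24.0):
    """Clock-time cosine proxy for circadian drive: +1 at the assumed
    acrophase (15:00 here, an illustrative choice) and -1 twelve hours
    later. No light or temperature input needed."""
    return math.cos(2.0 * math.pi * (hour_of_day - acrophase_h) / period_h)

# Peaks at the acrophase, troughs half a period away
assert abs(cosine_clock_feature(15.0) - 1.0) < 1e-9
assert abs(cosine_clock_feature(3.0) + 1.0) < 1e-9
```

Because it only needs the time of day, the feature is fully sensor-independent.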
| Performance Metric | 10 sec epochs | 30 sec epochs |
|---|---|---|
| F1 Score | 93.02% | 92.63% |
| Accuracy | 94.25% | 93.88% |
| Sensitivity | 93.05% | 94.21% |
| Precision | 92.99% | 91.10% |
| Specificity | 95.09% | 93.65% |
| Model | 10 sec epochs | 30 sec epochs |
|---|---|---|
| **raw** | | |
| Decision Tree | 0.79 | 0.78 |
| Logistic Regression | 0.78 | 0.78 |
| Neural Network | 0.81 | 0.80 |
| XGBoost | 0.67 | 0.70 |
| **median_5** | | |
| Decision Tree | 0.73 | 0.76 |
| Logistic Regression | 0.80 | 0.81 |
| Neural Network | 0.82 | 0.81 |
| XGBoost | 0.86 | 0.82 |
| **median_10** | | |
| Decision Tree | 0.74 | 0.78 |
| Logistic Regression | 0.81 | 0.82 |
| Neural Network | 0.84 | 0.83 |
| XGBoost | 0.88 | 0.83 |
| Metric | Decision Tree (10 s) | Logistic Regression (10 s) | Neural Network (10 s) | XGBoost (10 s) | Decision Tree (30 s) | Logistic Regression (30 s) | Neural Network (30 s) | XGBoost (30 s) |
|---|---|---|---|---|---|---|---|---|
| **raw** | | | | | | | | |
| F1 Score | 89.87% | 89.26% | 89.95% | 89.16% | 90.03% | 89.86% | 90.26% | 88.73% |
| Accuracy | 83.04% | 81.94% | 83.31% | 82.06% | 83.28% | 82.74% | 83.68% | 81.69% |
| Sensitivity | 97.35% | 97.19% | 96.74% | 95.50% | 95.70% | 96.97% | 95.94% | 91.48% |
| Precision | 83.45% | 82.53% | 84.06% | 83.61% | 84.99% | 83.71% | 85.22% | 86.15% |
| Specificity | 34.45% | 30.15% | 37.71% | 36.44% | 37.03% | 29.73% | 38.00% | 45.24% |
| **median_5** | | | | | | | | |
| F1 Score | 91.67% | 91.08% | 91.91% | 92.61% | 92.79% | 92.78% | 93.08% | 93.28% |
| Accuracy | 85.53% | 84.49% | 86.07% | 87.33% | 87.27% | 87.09% | 87.80% | 88.16% |
| Sensitivity | 98.42% | 97.80% | 97.79% | 98.05% | 97.18% | 98.36% | 97.25% | 97.36% |
| Precision | 85.79% | 85.22% | 86.70% | 87.73% | 88.79% | 87.79% | 89.24% | 89.52% |
| Specificity | 30.81% | 28.03% | 36.34% | 41.81% | 34.06% | 26.52% | 37.02% | 38.75% |
| **median_10** | | | | | | | | |
| F1 Score | 92.26% | 91.69% | 92.64% | 93.38% | 92.74% | 92.89% | 93.27% | 93.31% |
| Accuracy | 86.47% | 85.45% | 87.23% | 88.58% | 87.16% | 87.27% | 88.09% | 88.20% |
| Sensitivity | 98.31% | 97.71% | 97.87% | 98.11% | 96.93% | 98.29% | 97.51% | 97.20% |
| Precision | 86.92% | 86.36% | 87.94% | 89.08% | 88.90% | 88.06% | 89.38% | 89.71% |
| Specificity | 32.21% | 29.30% | 38.49% | 44.89% | 33.37% | 26.63% | 36.22% | 38.61% |
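I read the `median_5` and `median_10` rows as post-hoc median smoothing of the epoch-level predictions over 5- and 10-epoch windows (an assumption about the preprocessing, not something stated in the tables). A minimal sketch of such a filter:

```python
import statistics

def rolling_median(values, window):
    """Centered rolling median over epoch-level predictions.
    At the edges the window shrinks to the available neighbourhood,
    so the output has the same length as the input."""
    half = window // 2
    out = []
    for i in range(len(values)):
        lo, hi = max(0, i - half), min(len(values), i + half + 1)
        out.append(statistics.median(values[lo:hi]))
    return out

# Isolated spikes in a sleep/wake prediction sequence get removed,
# while sustained runs survive.
preds = [0, 0, 1, 0, 0, 1, 1, 1, 0, 1, 1]
smoothed = rolling_median(preds, window=5)
```

Smoothing this way trades a little sensitivity at transitions for fewer spurious single-epoch flips, which matches the pattern in the tables above.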
https://github.com/esbenlykke/sleep_study